In [2], [3], and [6], the information of a random variable ξ with respect to another random variable η is defined as

I(ξ, η) = sup Σ_{i,j} P(ξ ∈ A_i, η ∈ B_j) log [ P(ξ ∈ A_i, η ∈ B_j) / (P(ξ ∈ A_i) P(η ∈ B_j)) ],

where the supremum is taken over all finite measurable partitions {A_i} and {B_j}; this is a generalization of Shannon's definition of information and plays a fundamental role in information theory. The quantity I(ξ, ξ), called "the entropy of ξ", is widely used. When ξ is a continuous random variable, however, I(ξ, ξ) is infinite, so it is sometimes inconvenient to take I(ξ, ξ) as "the entropy of ξ". In [7], the author introduced a new quantity L(P_1, P_2, μ) and used it to define the entropy H(P_ξ; λ) = H_λ(ξ) of P_ξ with respect to λ. This definition of H_λ(ξ) is more general than that of "entropy" in [3] and subsumes both Shannon's entropy and Wiener's. ...
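The divergence of I(ξ, ξ) for continuous ξ is easy to see numerically: quantizing ξ to bins of width Δ yields a discrete variable whose Shannon entropy behaves like h(ξ) + log(1/Δ), where h is the differential (Wiener-type) entropy, and this grows without bound as Δ → 0. The following is a minimal Python sketch of this effect for a standard normal ξ; the Gaussian example, the sample size, and the function name `quantized_entropy_bits` are our own illustration, not taken from the paper.

```python
import numpy as np

# Sketch: the Shannon entropy of a quantized continuous variable diverges
# as the quantization is refined, which is why I(xi, xi) cannot serve as
# "the entropy of xi" when xi is continuous. (Gaussian example is illustrative.)

rng = np.random.default_rng(0)
samples = rng.standard_normal(1_000_000)  # xi ~ N(0, 1)

def quantized_entropy_bits(x, delta):
    """Shannon entropy (in bits) of x quantized to bins of width delta."""
    bins = np.floor(x / delta).astype(np.int64)
    _, counts = np.unique(bins, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Differential (Wiener-type) entropy of N(0, 1): (1/2) log2(2*pi*e) ~ 2.047 bits.
h_gauss = 0.5 * np.log2(2 * np.pi * np.e)

for delta in (1.0, 0.1, 0.01, 0.001):
    H = quantized_entropy_bits(samples, delta)
    predicted = h_gauss + np.log2(1.0 / delta)
    print(f"delta={delta:6.3f}  H={H:7.3f} bits   h + log2(1/delta) = {predicted:7.3f} bits")
```

Each tenfold refinement of Δ adds roughly log2(10) ≈ 3.32 bits, so the limit is infinite; the relative entropy H_λ(ξ) avoids this by measuring P_ξ against a reference measure λ rather than against ever-finer partitions.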
Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging...
The logical basis for information theory is the newly developed logic of partitions that is dual to ...
The objective of this note is to report some potentially useful mutual information inequalities. ...
There is no generally accepted definition for conditional Tsallis entropy. The standard definition o...
We live in the information age. Claude Shannon, as the father of the information age, gave us a theo...
Entropy, conditional entropy and mutual information for discrete-valued random variables pla...
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynam...
In this communication, we characterize a measure of information of type (α, β, γ) by taking certain ...
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynam...
In recent decades, different definitions of conditional Rényi entropy (CRE) have been introduced. Th...
Mutual information of two random variables can be easily obtained from their Shannon entropies. Howe...
Properties of the conditional entropy are studied and it is shown that Hartley's conditional ent...
The intriguing and still open question concerning the composition law of κ-entropy is here reconsid...
The logical basis for information theory is the newly developed logic of partitions that is dual to ...
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of informati...